
Creators/Authors contains: "Page, Xinru"


  1. We conducted a user study with 380 Android users, profiling them according to two key privacy behaviors: the number of apps installed and the number of "dangerous" permissions granted to those apps. We identified four unique privacy profiles: 1) Privacy Balancers (49.74% of participants), 2) Permission Limiters (28.68%), 3) App Limiters (14.74%), and 4) the Privacy Unconcerned (6.84%). App and Permission Limiters were significantly more concerned about perceived surveillance than Privacy Balancers and the Privacy Unconcerned. App Limiters had the fewest apps installed on their devices and the lowest intention to use apps and share information with them, whereas Permission Limiters had the most apps installed and reported a higher intention to share information with apps. The four profiles reflect the differing privacy management strategies, perceptions, and intentions of Android users, which go beyond the binary decision to share or withhold information via mobile apps.
  2. The prevalence of smartphones in our society warrants more research on understanding the characteristics of users and their information privacy behaviors when using mobile apps. This paper investigates the antecedents and consequences of "power use" (i.e., the competence and desire to use technology to its fullest) in the context of informational privacy. In a study with 380 Android users, we examined how gender and users' education level influence power use, and how power use affects users' intention to install apps and share information with them versus their actual privacy behaviors (i.e., the number of apps installed and the total number of "dangerous" permission requests granted to those apps). Our findings revealed an inconsistency in the effect of power use on users' information privacy behaviors: while the intention to install apps and to share information with them increased with power use, the actual number of installed apps and dangerous permissions ultimately granted decreased with power use. In other words, although the self-reported intentions suggested the opposite, people who scored higher on the power use scale appeared to be more prudent about their informational privacy than people who scored lower. We discuss the implications of this inconsistency and make recommendations for reconciling smartphone users' informational privacy intentions and behaviors.
  3. Various contact tracing approaches have been applied to help contain the spread of COVID-19, with technology-based tracing and human tracing among the most widely adopted. However, governments and communities worldwide vary in their adoption of digital contact tracing, with many instead choosing the human approach. We investigate how people perceive the respective benefits and risks of human and digital contact tracing through a mixed-methods survey with 291 respondents from the United States. Participants perceived digital contact tracing as more beneficial for protecting privacy, providing convenience, and ensuring data accuracy, and felt that human contact tracing could help provide security, emotional reassurance, advice, and accessibility. We explore the role of self-tracking technologies in public health crisis situations, highlighting how designs must adapt to promote societal benefit rather than just self-understanding. We discuss how future digital contact tracing can better balance the benefits of human tracers and technology amidst the complex contact tracing process and context.
  4. Smartphone location sharing is a particularly sensitive type of information disclosure that has implications for users' digital privacy and security as well as their physical safety. To understand and predict location disclosure behavior, we developed an Android app that scraped metadata from users' phones, asked them to grant the location-sharing permission to the app, and administered a survey. We compared the effectiveness of using self-report measures commonly used in the social sciences, behavioral data collected from users' mobile phones, and a new type of measure that we developed, representing a hybrid of self-report and behavioral data to contextualize users' attitudes toward their past location-sharing behaviors. This new type of measure is based on a reflective learning paradigm in which individuals reflect on past behavior to inform future behavior. Based on data from 380 Android smartphone users, we found that the best predictors of whether participants granted the location-sharing permission to our app were: behavioral intention to share information with apps, the "FYI" communication style, and one of our new hybrid measures asking users whether they were comfortable sharing location with apps currently installed on their smartphones. Our novel, hybrid construct of self-reflection on past behavior significantly improves predictive power and shows the importance of combining social science and computational science approaches for improving the prediction of users' privacy behaviors. Further, when assessing the construct validity of the Behavioral Intention construct drawn from previous location-sharing research, our data showed a clear distinction between two different types of Behavioral Intention: self-reported intention to use mobile apps versus the intention to share information with these apps. This finding suggests that users desire the ability to use mobile apps without being required to share sensitive information, such as their location. These results have important implications for cybersecurity research and system design to meet users' location-sharing privacy needs.
  5. This one-day workshop aims to explore ubiquitous privacy research and design in the context of mobile and IoT by facilitating discourse among scholars from the networked privacy and design communities. The complexity in modern socio-technical systems points to the potential of utilizing various design techniques (e.g., speculative design, design fiction, and research through design practices) in surfacing the potential consequences of novel technologies, particularly those that traditional user studies may not reveal. The results will shed light on future privacy designs for mobile and IoT technologies from both empirical and design perspectives. 
  6. Through a series of ACM SIGCHI workshops, we have built a research community of individuals dedicated to networked privacy--from identifying the key challenges to designing privacy solutions and setting a privacy-focused agenda for the future. In this workshop, we take an intentional pause to unpack the potential ethical questions and concerns this agenda might raise. Rather than strictly focusing on privacy as a state that is always desired--where more privacy is viewed unequivocally as "better"--we consider situations where privacy may not be optimal for researchers, end users, or society. We discuss the current research landscape, including the recent updates to the ACM's Code of Ethics, and how researchers and designers can make more informed decisions regarding ethics, privacy, and other competing values in privacy-related research and designs. Our workshop includes group discussions, breakout activities, and a panel of experts with diverse insights discussing topics related to privacy and ethics. 